Careers

BigData/Hadoop Administrator – Weehawken, NJ

Job Description

  • Understand, analyze, and document the complex requirements of Data Refinery and perform installation, configuration, troubleshooting, performance tuning, upgrades, backup, and recovery of various Bigdata/Hadoop services such as HDFS, Hive, Impala, HBase, Spark, and Kerberos.
  • Administering crucial and complex Bigdata/Hadoop infrastructure to enable client's next generation analytics and data science capabilities.
  • Engineer and operate, in collaboration with the client's Big Data Operations Team, a new center of excellence across the bank for big data technologies and services based on Cloudera and effective data fabric integration.
  • Work towards enabling solutions for the client's numerous lines of business in implementing modern big data analytics initiatives across Wealth Management, Investment Banking, and the Corporate Center, including Risk and Finance, HR, and Technology Services.
  • Adding immediate value by establishing big data environments based on Hadoop.
  • Innovating and evolving our big data capabilities through research and hands-on practice.
  • Operating a multi-tenant service, encompassing cluster management, security, resource and quota management, partitioning, monitoring, chargeback, data governance, quality, and lineage.
  • Collaborating with cross-functional teams across hardware, platform services and operations.
  • Advising users in architecting complex big data solutions.
  • Managing complex installation and configuration of different clusters with Cloudera CDH 5.13 on RHEL and troubleshooting issues related to the clusters.
  • Integrate and troubleshoot issues related to Informatica and Hadoop environments.
  • Understanding the client's various business groups' requirements and workloads, configuring the cluster to be multi-tenant, and provisioning clusters on shared hardware/storage.
  • Understand the client's security policies and standards, work in accordance with them, and work towards enterprise security integration.
  • Understand and analyze the complex use cases of the bank across various lines of business, devise/design solutions, and provision clusters in the cloud, on-premises, or as a hybrid cluster.
  • Support and administer complex data integrations with various databases such as Oracle and analytics tools such as SAS and Data Refinery.

Required Skills:

  • A minimum of a bachelor's degree in Computer Science or equivalent.
  • Cloudera Hadoop (CDH), Cloudera Manager, Informatica Bigdata Edition (BDM), HDFS, YARN, MapReduce, Hive, Impala, Kudu, Sqoop, Spark, Kafka, HBase, Teradata Studio Express, Teradata, Tableau, Kerberos, Active Directory, Sentry, TLS/SSL, Linux/RHEL, Unix, Windows, SBT, Maven, Jenkins, Oracle, MS SQL Server, Shell Scripting, Eclipse IDE, Git, SVN
  • Must have strong problem-solving and analytical skills.
  • Must have the ability to identify complex problems and review related information to develop and evaluate options and implement solutions.

If you are interested in working in a fast-paced, challenging, fun, entrepreneurial environment and would like the opportunity to be a part of this fascinating industry, send your resume to HSTechnologies LLC, 2801 W Parker Road, Suite #5, Plano, TX 75023, or email it to hr@sbhstech.com.